
softmax activation function


Persian

1 Electrical & Electronics:: تابع فعالیت softmax

The output layer is also very similar, but it uses a softmax activation function instead of a ReLU activation function. Also note that logits is the output of the neural network before going through the softmax activation function: for optimization reasons, we will handle the softmax computation later. The sparse_softmax_cross_entropy_with_logits() function computes the cross entropy based on the logits (i.e., the output of the network before going through the softmax activation function), and it expects labels in the form of integers ranging from 0 to the number of classes minus 1 (in our case, from 0 to 9). It is equivalent to applying the softmax activation function and then computing the cross entropy, but it is more efficient, and it properly takes care of corner cases like logits equal to 0. This is why we did not apply the softmax activation function earlier.
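
The equivalence described above is easy to check numerically. Below is a minimal sketch, assuming NumPy; the function name mirrors TensorFlow's tf.nn.sparse_softmax_cross_entropy_with_logits, but this is a standalone illustrative reimplementation, not the library's code. It fuses softmax and cross entropy via the log-sum-exp trick, which stays finite even for extreme logits where the naive two-step computation hits log(0):

```python
import numpy as np

def softmax(logits):
    # Shift by the row max so exp() cannot overflow.
    shifted = logits - np.max(logits, axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return exp / np.sum(exp, axis=-1, keepdims=True)

def sparse_softmax_cross_entropy_with_logits(labels, logits):
    # Illustrative reimplementation (not TensorFlow's).
    # Fused softmax + cross entropy via log-sum-exp:
    #   loss_i = logsumexp(logits_i) - logits_i[labels_i]
    shifted = logits - np.max(logits, axis=-1, keepdims=True)
    log_sum_exp = np.log(np.sum(np.exp(shifted), axis=-1))
    label_logits = shifted[np.arange(len(labels)), labels]
    return log_sum_exp - label_logits

logits = np.array([[2.0, 1.0, 0.1],
                   [5.0, 2.0, 1000.0]])  # second row has an extreme logit
labels = np.array([0, 0])                # integer labels, 0..n_classes-1

# Naive two-step version: apply softmax, then compute cross entropy.
# For the extreme row, softmax underflows to 0 and -log(0) gives inf.
naive = -np.log(softmax(logits)[np.arange(len(labels)), labels])

fused = sparse_softmax_cross_entropy_with_logits(labels, logits)
print(naive)  # [0.417..., inf] with a runtime warning
print(fused)  # [0.417..., 995.0] -- finite, matching where naive is stable
```

On well-behaved inputs the two versions agree; the fused form only differs where the naive one breaks down, which is the efficiency and corner-case argument the excerpt makes for not applying softmax earlier.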

Glossary of the Iranian Translators Network

